Neural Networks Trained with the EEM Algorithm: Tuning the Smoothing Parameter

Authors

  • JORGE M. SANTOS
  • JOAQUIM MARQUES DE SÁ
  • LUÍS A. ALEXANDRE
Abstract

The training of neural networks, and in particular of multi-layer perceptrons (MLPs), is performed by minimizing an error function usually known as the "cost function". In our previous works we applied the Error Entropy Minimization (EEM) algorithm, and its optimized version, to classification, using as cost function the entropy of the errors between the outputs of the neural network and the desired targets. One of the difficulties in implementing the EEM algorithm is the choice of the smoothing parameter, also known as the window size, in the Parzen window probability density function estimation used to compute the entropy and its gradient. We present here a formula yielding the value of the smoothing parameter as a function of the number of data samples and of the neural network output dimension. Several experiments with real data sets were performed to show the validity of the proposed formula.

Key-Words: Entropy, Parzen, Smoothing Parameter, Cost Function
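The abstract does not reproduce the proposed formula itself, but the quantities it mentions can be illustrated. Below is a minimal sketch, assuming the Rényi quadratic entropy estimator with a Gaussian Parzen kernel that is commonly used with EEM; the `smoothing_parameter` helper is a hypothetical placeholder with Silverman-style scaling in the sample count and output dimension, not the formula derived in the paper.

```python
import numpy as np

def smoothing_parameter(n_samples, out_dim, c=1.0):
    """Hypothetical placeholder: a Silverman-style rule h ~ c * N^(-1/(d+4)).
    The paper derives its own formula in N and the output dimension, which
    the abstract does not state; this is only an illustrative stand-in."""
    return c * n_samples ** (-1.0 / (out_dim + 4))

def renyi_quadratic_entropy(errors, h):
    """Parzen-window estimate of Renyi's quadratic entropy of the errors,
    H_R2 = -log V, where V (the "information potential") is the average of
    Gaussian kernels over all pairs of error samples.

    errors: (N, d) array of network-output-minus-target errors
    h:      smoothing parameter (Parzen window size)
    """
    n, d = errors.shape
    diff = errors[:, None, :] - errors[None, :, :]   # pairwise e_i - e_j
    sq_dist = np.sum(diff ** 2, axis=-1)
    # Convolving two Gaussian kernels of width h yields one of variance 2*h^2.
    norm = (4.0 * np.pi * h ** 2) ** (d / 2.0)
    v = np.mean(np.exp(-sq_dist / (4.0 * h ** 2))) / norm
    return -np.log(v)

# Example: entropy of the errors of a 2-output classifier on 100 samples.
rng = np.random.default_rng(0)
e = rng.normal(scale=0.1, size=(100, 2))
h = smoothing_parameter(n_samples=100, out_dim=2)
print(renyi_quadratic_entropy(e, h))
```

Note how h enters every kernel evaluation: too small a window makes the estimate spiky and the gradient noisy, too large a window washes out the error structure, which is why a principled choice of h matters for EEM training.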


Similar Resources

Data Classification with Neural Networks and Entropic Criteria

The concept of entropy and related measures has been applied in learning systems since the 1980s. Several researchers have applied entropic concepts to independent component analysis and blind source separation. Most previous works using entropy and mutual information in neural networks address prediction and regression problems. In this thesis we use entropy in two diffe...


Optimization of the Error Entropy Minimization Algorithm for Neural Network Classification

One way of using entropy criteria in learning systems is to minimize the entropy of the error between the output of the learning system and the desired targets. In our last work, we introduced the Error Entropy Minimization (EEM) algorithm for neural network classification. There are some sensible aspects in the optimization of the EEM algorithm: the size of the Parzen Window (smoothing paramet...
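As a sketch of how such a criterion enters training (under the same Gaussian-kernel assumptions as the snippet above, not the paper's exact derivation): minimizing the quadratic error entropy is equivalent to maximizing the information potential V, so the gradient of V with respect to each error sample can take the place of the MSE derivative in backpropagation.

```python
import numpy as np

def information_potential_gradient(errors, h):
    """Gradient of the information potential V with respect to each error
    sample e_k. Since H_R2 = -log V, minimizing the error entropy amounts
    to maximizing V, so dV/de_k can be fed into backpropagation in place
    of the usual dMSE/de_k. Sketch only, under Gaussian-kernel assumptions.
    """
    n, d = errors.shape
    diff = errors[:, None, :] - errors[None, :, :]        # (N, N, d)
    sq_dist = np.sum(diff ** 2, axis=-1)                  # (N, N)
    norm = (4.0 * np.pi * h ** 2) ** (d / 2.0)
    kernel = np.exp(-sq_dist / (4.0 * h ** 2)) / norm     # (N, N)
    # dV/de_k = -(1/(N^2 h^2)) * sum_j kernel[k, j] * (e_k - e_j)
    return -(kernel[..., None] * diff).sum(axis=1) / (n ** 2 * h ** 2)
```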


On the use of back propagation and radial basis function neural networks in surface roughness prediction

Various artificial neural network types are examined and compared for the prediction of surface roughness in manufacturing technology. The aim of the study is to evaluate different kinds of neural networks and observe their performance and applicability on the same problem. More specifically, feed-forward artificial neural networks are trained with three different back propagation algorithms, ...


A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network

Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). The absence of an appropriate structure, convergence to local optima, and low learning speed are deficiencies of FWNNs in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address the aforementioned learning deficiencies. Differential Evolution...


Prediction of Gain in LD-CELP Using Hybrid Genetic/PSO-Neural Models

In this paper, the gain in the LD-CELP speech coding algorithm is predicted using three neural models, which are equipped with genetic and particle swarm optimization (PSO) algorithms to optimize the structure and parameters of the neural networks. Elman, multi-layer perceptron (MLP) and fuzzy ARTMAP are the candidate neural models. The optimized number of nodes in the first and second hidden layers of El...




Publication date: 2005